Data fitting with signomial programming compatible difference of convex functions
Abstract
Signomial Programming (SP) has proven to be a powerful tool for engineering design optimization, striking a balance between the computational efficiency of Geometric Programming (GP) and the extensibility of more general optimization methods. While techniques exist for fitting GP-compatible models to data, none have been proposed that take advantage of the increased modeling flexibility available in SP. Here, a new Difference of Softmax Affine function is constructed by utilizing existing Difference of Convex (DC) functions. This class of functions is fit to data in log–log space and becomes either a signomial or a set of signomials upon inverse transformation. Examples presented here include simple test cases in 1D and 2D, and the performance of the NACA 24xx family of airfoils. In each case, RMS error is driven to less than 1%.
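The fitting idea the abstract outlines — a difference of two softmax-affine (log-sum-exp of affine terms) functions, fit to data in log–log space — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the two-term softmax-affine form, the synthetic test function, the softplus reparameterization of the sharpness parameters, and the use of `scipy.optimize.least_squares` are all assumptions of this example.

```python
import numpy as np
from scipy.optimize import least_squares

def sma(xl, w, b, alpha):
    # Softmax-affine function of a 1-D input in log space:
    # (1/alpha) * logsumexp(alpha * (w_k * xl + b_k)) over K affine terms.
    z = alpha * (np.outer(xl, w) + b)          # shape (n, K)
    m = z.max(axis=1, keepdims=True)           # stabilize the log-sum-exp
    return (m[:, 0] + np.log(np.exp(z - m).sum(axis=1))) / alpha

def dsa(xl, p):
    # Difference of two K=2 softmax-affine functions (a DC decomposition);
    # softplus keeps both sharpness parameters strictly positive.
    w1, b1, w2, b2 = p[0:2], p[2:4], p[5:7], p[7:9]
    a1, a2 = np.log1p(np.exp(p[4])), np.log1p(np.exp(p[9]))
    return sma(xl, w1, b1, a1) - sma(xl, w2, b2, a2)

# Synthetic 1-D test case: y = x^2 + 1/x, a posynomial, on x in [0.5, 4].
x = np.linspace(0.5, 4.0, 200)
y = x**2 + 1.0 / x
xl, yl = np.log(x), np.log(y)                  # fit happens in log-log space

# Deterministic, mildly asymmetric initial guess for the 10 parameters.
p0 = np.array([1.0, -0.5, 0.0, 0.0, 0.0, 0.5, 0.5, 0.0, 0.0, 0.0])
res = least_squares(lambda p: dsa(xl, p) - yl, x0=p0)
rms = np.sqrt(np.mean((dsa(xl, res.x) - yl) ** 2))
```

Upon exponentiating, each affine term `exp(b_k) * x**w_k` is a monomial, so the fitted difference maps back to a signomial form; the paper reports driving RMS error below 1% on cases like this.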
Similar resources
Difference of Convex Functions Programming Applied to Control with Expert Data
This paper reports applications of Difference of Convex functions (DC) programming to Learning from Demonstrations (LfD) and Reinforcement Learning (RL) with expert data. This is made possible because the norm of the Optimal Bellman Residual (OBR), which is at the heart of many RL and LfD algorithms, is DC. Improvement in performance is demonstrated on two specific algorithms, namely Reward-reg...
Difference of Convex Functions Programming for Reinforcement Learning
Large Markov Decision Processes are usually solved using Approximate Dynamic Programming methods such as Approximate Value Iteration or Approximate Policy Iteration. The main contribution of this paper is to show that, alternatively, the optimal state-action value function can be estimated using Difference of Convex functions (DC) Programming. To do so, we study the minimization of a norm of th...
Solving Indefinite Kernel Support Vector Machine with Difference of Convex Functions Programming
Indefinite kernel support vector machine (IKSVM) has recently attracted increasing attention in machine learning. Unlike traditional SVMs, IKSVM is essentially a non-convex optimization problem. Some algorithms directly change the spectrum of the indefinite kernel matrix, at the cost of losing some valuable information involved in the kernels, so as to transform the non-convex problem in...
On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions
Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...
Shape Fitting and Non-Convex Data Analysis
This contribution addresses the problem of curve and surface evolution. We explain a general framework for the evolution-based approximation of a given set of points by a curve, then apply this method to surfaces. We show the sequential evolution of curves and surfaces on some concrete examples. We apply curve evolution as a solution method in statistical data analysis. Our aim is to use ...
Journal
Journal title: Optimization and Engineering
Year: 2022
ISSN: 1389-4420, 1573-2924
DOI: https://doi.org/10.1007/s11081-022-09717-4